
csv-streamify
Parses csv files. Accepts options. Handles weird encodings. No coffee script, no weird APIs. Just streams. Tested against csv-spectrum and used in production.
npm install csv-streamify
This module implements a simple node 0.10.x stream.Transform stream.
Note: csv-streamify pulls in the readable-stream module, so it also works on node 0.8.
var csv = require('csv-streamify'),
    fs = require('fs')

var fstream = fs.createReadStream('/path/to/file'),
    parser = csv(options /* optional */, callback /* optional */)

// emits each line as a buffer or as a string representing an array of fields
parser.on('readable', function () {
  var line = parser.read()
  // do stuff with data as it comes in

  // current line number
  console.log(parser.lineNo)
})

// AND/OR
function callback (err, doc) {
  if (err) return handleErrorGracefully(err)

  // doc is an array of row arrays
  doc.forEach(function (row) { console.log(row) })
}

// now pump some data into it (and pipe it somewhere else)
fstream.pipe(parser).pipe(nirvana)
Note: if you pass a callback to csv-streamify, it will buffer the parsed data for you and pass it to the callback when it's done. With very large CSV files this can obviously lead to out-of-memory errors.
You can pass some options to the parser. All of them are optional.
The options are also passed to the underlying transform stream, so you can pass in any standard node core stream options.
{
  delimiter: ',',  // comma, semicolon, whatever
  newline: '\n',   // newline character (use \r\n for CRLF files)
  quote: '"',      // what's considered a quote
  empty: '',       // empty fields are replaced by this

  // specify the encoding of the source if it's something other than utf8
  inputEncoding: '',

  // if true, emit arrays instead of stringified arrays or buffers
  objectMode: false,

  // if set to true, uses the first row as keys -> [ { column1: value1, column2: value2 }, ... ]
  columns: false
}
In order for the inputEncoding option to take effect you need to install iconv-lite (npm install iconv-lite --save). Take a look at the iconv-lite documentation for supported encodings. (iconv-lite provides pure JavaScript character encoding conversion, so no native code compilation is required.)
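To see why decoding matters, here is what happens to a Latin-1 byte when it is read as UTF-8 versus decoded with the right encoding. Node core happens to support latin1 itself, so it is used here as an illustration; iconv-lite is what you need for the many encodings Node does not ship with.

```javascript
// 0xE9 is 'é' in Latin-1, but an invalid byte sequence in UTF-8
var bytes = Buffer.from([0x63, 0x61, 0x66, 0xe9]) // "café" in Latin-1

console.log(bytes.toString('utf8'))   // 'caf\ufffd' -- the byte cannot be decoded
console.log(bytes.toString('latin1')) // 'café'
```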
To use on the command line install it globally:
$ npm install csv-streamify -g
This should add the csv-streamify command to your $PATH.
Then, you either pipe data into it or give it a filename:
# pipe data in
$ cat some_data.csv | csv-streamify
# pass a filename
$ csv-streamify some_data.csv > output.json
# tell csv-streamify to read from + wait on stdin
$ csv-streamify -
If you would like to contribute, just open an issue so we can discuss it further. :)
Contributors: Nicolas Hery (objectMode)